100 research outputs found

    Data-driven spectral decomposition and forecasting of ergodic dynamical systems

    We develop a framework for dimension reduction, mode decomposition, and nonparametric forecasting of data generated by ergodic dynamical systems. This framework is based on a representation of the Koopman and Perron-Frobenius groups of unitary operators in a smooth orthonormal basis of the L^2 space of the dynamical system, acquired from time-ordered data through the diffusion maps algorithm. Using this representation, we compute Koopman eigenfunctions through a regularized advection-diffusion operator, and employ these eigenfunctions in dimension reduction maps with projectible dynamics and high smoothness for the given observation modality. In systems with pure point spectra, we construct a decomposition of the generator of the Koopman group into mutually commuting vector fields that transform naturally under changes of observation modality, which we reconstruct in data space through a representation of the pushforward map in the Koopman eigenfunction basis. We also establish a correspondence between Koopman operators and Laplace-Beltrami operators constructed from data in Takens delay-coordinate space, and use this correspondence to provide an interpretation of diffusion-mapped delay coordinates for this class of systems. Moreover, we take advantage of a special property of the Koopman eigenfunction basis, namely that the basis elements evolve as simple harmonic oscillators, to build nonparametric forecast models for probability densities and observables. In systems with more complex spectral behavior, including mixing systems, we develop a method inspired by time change in dynamical systems to transform the generator to a new operator with potentially improved spectral properties, and use that operator for vector field decomposition and nonparametric forecasting. Comment: 56 pages, 20 figures
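The diffusion maps step that produces the smooth orthonormal basis can be sketched as follows. This is an illustrative minimal implementation of the standard (alpha = 1) diffusion maps normalization, not the paper's code; the function name and bandwidth handling are assumptions.

```python
import numpy as np

def diffusion_maps_basis(X, epsilon, n_basis):
    """Minimal diffusion maps sketch: build a Markov-normalized Gaussian
    kernel on the samples and take its leading eigenvectors as a
    data-driven basis (illustrative, fixed bandwidth)."""
    # pairwise squared distances between samples (rows of X)
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / epsilon)
    # alpha = 1 normalization removes the sampling-density bias
    q = K.sum(axis=1)
    K = K / np.outer(q, q)
    # row-normalize to obtain a Markov transition matrix
    P = K / K.sum(axis=1, keepdims=True)
    # eigenvectors of P approximate Laplace-Beltrami eigenfunctions
    evals, evecs = np.linalg.eig(P)
    order = np.argsort(-evals.real)
    return evals.real[order[:n_basis]], evecs.real[:, order[:n_basis]]
```

The leading eigenvalue of the Markov matrix is 1 with a constant eigenvector; the subsequent eigenvectors supply the smooth basis in which the Koopman and Perron-Frobenius operators are represented.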

    Analog Forecasting with Dynamics-Adapted Kernels

    Analog forecasting is a nonparametric technique introduced by Lorenz in 1969 which predicts the evolution of states of a dynamical system (or observables defined on the states) by following the evolution of the sample in a historical record of observations which most closely resembles the current initial data. Here, we introduce a suite of forecasting methods which improve traditional analog forecasting by combining kernel methods developed in harmonic analysis and machine learning with state-space reconstruction for dynamical systems. A key ingredient of our approach is to replace single-analog forecasting with weighted ensembles of analogs constructed using local similarity kernels. The kernels used here employ a number of dynamics-dependent features designed to improve forecast skill, including Takens' delay-coordinate maps (to recover information in the initial data lost through partial observations) and a directional dependence on the dynamical vector field generating the data. Mathematically, our approach is closely related to kernel methods for out-of-sample extension of functions, and we discuss alternative strategies based on the Nyström method and the multiscale Laplacian pyramids technique. We illustrate these techniques in applications to forecasting in a low-order deterministic model for atmospheric dynamics with chaotic metastability, and interannual-scale forecasting in the North Pacific sector of a comprehensive climate model. We find that forecasts based on kernel-weighted ensembles have significantly higher skill than the conventional approach following a single analog. Comment: submitted to Nonlinearity
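The core idea, replacing Lorenz's single best analog with a kernel-weighted ensemble, can be sketched in a few lines. This is a bare-bones illustration with a plain Gaussian kernel; the paper's dynamics-adapted kernels (delay coordinates, vector-field dependence) are omitted, and all names are illustrative.

```python
import numpy as np

def kernel_analog_forecast(history, current, lead, epsilon):
    """Weighted-ensemble analog forecast: states in the historical record
    similar to the current initial data are weighted by a Gaussian kernel,
    and their lead-step-ahead successors are averaged (illustrative sketch)."""
    past = history[:-lead] if lead > 0 else history  # states with a known future
    future = history[lead:]                          # their lead-time successors
    d2 = np.sum((past - current) ** 2, axis=1)
    w = np.exp(-d2 / epsilon)
    w /= w.sum()
    return w @ future  # kernel-weighted ensemble forecast
```

As the bandwidth epsilon shrinks, the forecast reduces to the conventional single-analog prediction; a moderate bandwidth averages over an ensemble of nearby analogs, which is what improves skill in the paper's experiments.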

    Delay-coordinate maps and the spectra of Koopman operators

    The Koopman operator induced by a dynamical system is inherently linear and provides an alternate method of studying many properties of the system, including attractor reconstruction and forecasting. Koopman eigenfunctions represent the non-mixing component of the dynamics. They factor the dynamics, which can be chaotic, into quasiperiodic rotations on tori. Here, we describe a method through which these eigenfunctions can be obtained from a kernel integral operator, which also annihilates the continuous spectrum. We show that incorporating a large number of delay coordinates in constructing the kernel of that operator results, in the limit of infinitely many delays, in the creation of a map into the discrete spectrum subspace of the Koopman operator. This enables efficient approximation of Koopman eigenfunctions from high-dimensional data in systems with pure point or mixed spectra.
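The delay-coordinate kernel construction can be sketched as below: embed the time series with Takens delays, build a symmetric Gaussian kernel matrix in the embedded space, and take its leading eigenvectors. This is an illustrative toy version (finite delays, dense eigensolver); function names and the bandwidth scaling are assumptions, not the paper's API.

```python
import numpy as np

def delay_embed(x, q):
    """Takens delay-coordinate map: each sample becomes the window
    of the q most recent observations of the scalar series x."""
    n = len(x) - q + 1
    return np.stack([x[i:i + q] for i in range(n)])

def kernel_eigenfunctions(x, q, epsilon, n_eig):
    """Sketch: eigenvectors of a Gaussian kernel matrix built in
    delay-coordinate space; with many delays these approximate functions
    in the discrete-spectrum subspace of the Koopman operator."""
    Y = delay_embed(x, q)
    D2 = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D2 / (epsilon * q))  # bandwidth scaled with embedding dimension
    evals, evecs = np.linalg.eigh(K)
    # eigh returns ascending order; flip to descending
    return evals[::-1][:n_eig], evecs[:, ::-1][:, :n_eig]
```

In the paper's limit of infinitely many delays, the continuous-spectrum component of the dynamics is averaged out of the kernel, which is why the leading eigenvectors concentrate on the point spectrum.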

    Extraction and Prediction of Coherent Patterns in Incompressible Flows through Space-Time Koopman Analysis

    We develop methods for detecting and predicting the evolution of coherent spatiotemporal patterns in incompressible time-dependent fluid flows driven by ergodic dynamical systems. Our approach is based on representations of the generators of the Koopman and Perron-Frobenius groups of operators governing the evolution of observables and probability measures on Lagrangian tracers, respectively, in a smooth orthonormal basis learned from velocity field snapshots through the diffusion maps algorithm. These operators are defined on the product space between the state space of the fluid flow and the spatial domain in which the flow takes place, and as a result their eigenfunctions correspond to global space-time coherent patterns under a skew-product dynamical system. Moreover, using this data-driven representation of the generators in conjunction with Leja interpolation for matrix exponentiation, we construct model-free prediction schemes for the evolution of observables and probability densities defined on the tracers. We present applications to periodic Gaussian vortex flows and aperiodic flows generated by Lorenz 96 systems. Comment: 65 pages, 17 figures, links to accompanying videos provided
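The model-free prediction step amounts to exponentiating the data-driven generator matrix and applying it to a coefficient vector. The sketch below substitutes a plain eigendecomposition for the paper's Leja interpolation (adequate only for small illustrative matrices); names are illustrative.

```python
import numpy as np

def evolve_density(L, c0, t):
    """Push the basis coefficients of a density or observable forward in
    time by exponentiating the generator matrix L, i.e. c(t) = exp(t L) c0.
    Diagonalization stands in for the paper's Leja interpolation here."""
    evals, V = np.linalg.eig(L)
    c0 = np.asarray(c0, dtype=complex)
    # exp(t L) c0 = V diag(exp(t * evals)) V^{-1} c0
    return V @ (np.exp(t * evals) * np.linalg.solve(V, c0))
```

For example, with the antisymmetric generator L = [[0, -1], [1, 0]] (a rotation), evolving c0 = (1, 0) for time t traces out (cos t, sin t), mirroring how eigenvalues of the skew-product generator produce oscillatory coherent patterns.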

    Indo-Pacific variability on seasonal to multidecadal timescales. Part I: Intrinsic SST modes in models and observations

    The variability of Indo-Pacific SST on seasonal to multidecadal timescales is investigated using a recently introduced technique called nonlinear Laplacian spectral analysis (NLSA). Through this technique, drawbacks associated with ad hoc pre-filtering of the input data are avoided, enabling recovery of low-frequency and intermittent modes not previously accessible via classical approaches. Here, a multiscale hierarchy of spatiotemporal modes is identified for Indo-Pacific SST in millennial control runs of CCSM4 and CM3 and in HadISST data. On interannual timescales, a mode with spatiotemporal patterns corresponding to the fundamental component of ENSO emerges, along with ENSO-modulated annual modes consistent with combination mode theory. The ENSO combination modes also feature prominent activity in the Indian Ocean, explaining a significant fraction of the SST variance in regions associated with the Indian Ocean dipole. A pattern resembling the tropospheric biennial oscillation emerges in addition to ENSO and the associated combination modes. On multidecadal timescales, the dominant NLSA mode in the model data is predominantly active in the western tropical Pacific. The interdecadal Pacific oscillation also emerges as a distinct NLSA mode, though with smaller explained variance than the western Pacific multidecadal mode. Analogous modes on interannual and decadal timescales are also identified in HadISST data for the industrial era, as well as in model data of comparable timespan, though decadal modes are either absent or of degraded quality in these datasets. Comment: 85 pages, 19 figures; submitted to Journal of Climate

    Nonlinear Laplacian spectral analysis: Capturing intermittent and low-frequency spatiotemporal patterns in high-dimensional data

    We present a technique for spatiotemporal data analysis called nonlinear Laplacian spectral analysis (NLSA), which generalizes singular spectrum analysis (SSA) to take into account the nonlinear manifold structure of complex data sets. The key principle underlying NLSA is that the functions used to represent temporal patterns should exhibit a degree of smoothness on the nonlinear data manifold M, a constraint absent from classical SSA. NLSA enforces such a notion of smoothness by requiring that temporal patterns belong to low-dimensional Hilbert spaces V_l spanned by the leading l Laplace-Beltrami eigenfunctions on M. These eigenfunctions can be evaluated efficiently in high ambient-space dimensions using sparse graph-theoretic algorithms. Moreover, they provide orthonormal bases to expand a family of linear maps, whose singular value decomposition leads to sets of spatiotemporal patterns at progressively finer resolution on the data manifold. The Riemannian measure of M and an adaptive graph kernel width enhance the capability of NLSA to detect important nonlinear processes, including intermittency and rare events. The minimum dimension of V_l required to capture these features while avoiding overfitting is estimated here using spectral entropy criteria. Comment: 39 pages, 8 figures, invited paper under review in Statistical Analysis and Data Mining
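Once the Laplace-Beltrami eigenfunctions are in hand, the NLSA step that produces spatiotemporal patterns is an SVD of a linear map expanded in that basis. The sketch below is an illustrative reading of that step with assumed array conventions (it is not the paper's implementation): X holds the lag-embedded data, phi the leading eigenfunctions, and mu the Riemannian (sampling) measure.

```python
import numpy as np

def nlsa_patterns(X, phi, mu):
    """Sketch of the NLSA pattern-extraction step: expand the lag-embedded
    data X (samples x ambient dims) in the leading Laplace-Beltrami
    eigenfunctions phi (samples x l) with respect to the measure mu
    (samples,), then take an SVD of the resulting linear map."""
    # A maps the l-dimensional space V_l into data space:
    # A[:, k] = sum_i mu[i] * phi[i, k] * X[i, :]
    A = X.T @ (phi * mu[:, None])
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # U: spatial patterns; phi @ Vt.T: temporal patterns on the data manifold
    return U, s, phi @ Vt.T
```

Restricting the SVD to the span of the leading l eigenfunctions is what injects manifold smoothness into the recovered patterns, in contrast to classical SSA, which uses the raw covariance operator.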

    The Symmetries of Image Formation by Scattering. I. Theoretical Framework

    We perceive the world through images formed by scattering. The ability to interpret scattering data mathematically has opened to our scrutiny the constituents of matter, the building blocks of life, and the remotest corners of the universe. Here, we deduce for the first time the fundamental symmetries underlying image formation. Intriguingly, these are similar to those of the anisotropic "Taub universe" of general relativity, with eigenfunctions closely related to spinning tops in quantum mechanics. This opens the possibility to apply the powerful arsenal of tools developed in two major branches of physics to new problems. We augment these tools with graph-theoretic means to recover the three-dimensional structure of objects from random snapshots of unknown orientation at four orders of magnitude higher complexity than previously demonstrated. Our theoretical framework offers a potential link to recent observations on face perception in higher primates. In a later paper, we demonstrate the recovery of structure and dynamics from ultralow-signal random sightings of systems with no orientational or timing information. Comment: 16 pages, 65 references, 3 figures, 3 tables. Movies available at http://www.cims.nyu.edu/~dimitri

    Nonparametric forecasting of low-dimensional dynamical systems

    This letter presents a nonparametric modeling approach for forecasting stochastic dynamical systems on low-dimensional manifolds. The key idea is to represent the discrete shift maps on a smooth basis which can be obtained by the diffusion maps algorithm. In the limit of large data, this approach converges to a Galerkin projection of the semigroup solution to the underlying dynamics on a basis adapted to the invariant measure. This approach allows one to quantify uncertainties (in fact, evolve the probability distribution) for non-trivial dynamical systems with equation-free modeling. We verify our approach on various examples, ranging from an inhomogeneous anisotropic stochastic differential equation on a torus and the chaotic three-dimensional Lorenz model to the Niño-3.4 data set, which is used as a proxy of the El Niño-Southern Oscillation. Comment: Supplemental videos available at: http://personal.psu.edu/thb11
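The Galerkin representation of the shift map in a data-driven basis has a particularly compact Monte Carlo estimate, sketched below. This is an illustrative reconstruction, not the letter's code; phi[i, k] is assumed to hold the k-th basis function evaluated at the i-th sample of the time-ordered data.

```python
import numpy as np

def shift_map_galerkin(phi):
    """Monte Carlo estimate of the Galerkin-projected shift operator:
    U[j, k] ~ <phi_j, phi_k o shift>, averaging products of basis values
    at consecutive samples of the time-ordered data."""
    return phi[:-1].T @ phi[1:] / (len(phi) - 1)

def forecast_coefficients(c0, U, steps):
    """Evolve the coefficient vector of an observable (or probability
    density) by repeated application of the projected shift operator."""
    c = np.asarray(c0, dtype=float).copy()
    for _ in range(steps):
        c = U @ c
    return c
```

Because the basis is adapted to the invariant measure, powers of U approximate the semigroup solution, which is what lets the method evolve full probability distributions rather than point forecasts.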

    Delay-coordinate maps, coherence, and approximate spectra of evolution operators

    The problem of data-driven identification of coherent observables of measure-preserving, ergodic dynamical systems is studied using kernel integral operator techniques. An approach is proposed whereby complex-valued observables with approximately cyclical behavior are constructed from a pair of eigenfunctions of integral operators built from delay-coordinate mapped data. It is shown that these observables are ϵ-approximate eigenfunctions of the Koopman evolution operator of the system, with a bound ϵ controlled by the length of the delay-embedding window, the evolution time, and appropriate spectral gap parameters. In particular, ϵ can be made arbitrarily small as the embedding window increases so long as the corresponding eigenvalues remain sufficiently isolated in the spectrum of the integral operator. It is also shown that the time-autocorrelation functions of such observables behave like those of ϵ-approximate Koopman eigenfunctions, exhibiting a well-defined characteristic oscillatory frequency (estimated using the Koopman generator) and a slowly decaying modulating envelope. The results hold for measure-preserving, ergodic dynamical systems of arbitrary spectral character, including mixing systems with continuous spectrum and no non-constant Koopman eigenfunctions in L^2. Numerical examples reveal a coherent observable of the Lorenz 63 system whose autocorrelation function remains above 0.5 in modulus over approximately 10 Lyapunov timescales. Comment: 36 pages, 5 figures
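The diagnostic used to assess coherence, the time-autocorrelation function of a complex-valued observable sampled along a trajectory, can be computed with a short routine like the following sketch (illustrative names; a biased or FFT-based estimator would also work).

```python
import numpy as np

def autocorrelation(z, max_lag):
    """Normalized time-autocorrelation of a complex observable z along a
    trajectory.  For an ϵ-approximate Koopman eigenfunction this should
    oscillate at a well-defined frequency under a slowly decaying envelope."""
    z = z - z.mean()
    n = len(z)
    c0 = np.vdot(z, z).real / n  # variance (vdot conjugates its first argument)
    return np.array([np.vdot(z[:n - t], z[t:]) / ((n - t) * c0)
                     for t in range(max_lag)])
```

For an exact Koopman eigenfunction z(t) = exp(iωt), the modulus of this autocorrelation stays near 1 at all lags; the paper's result is that observables built from delay-coordinate integral operators come within ϵ of this ideal behavior.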

    Quantum mechanics and data assimilation

    A framework for data assimilation combining aspects of operator-theoretic ergodic theory and quantum mechanics is developed. This framework adapts the Dirac-von Neumann formalism of quantum dynamics and measurement to perform sequential data assimilation (filtering) of a partially observed, measure-preserving dynamical system, using the Koopman operator on the L^2 space associated with the invariant measure as an analog of the Heisenberg evolution operator in quantum mechanics. In addition, the state of the data assimilation system is represented by a trace-class operator analogous to the density operator in quantum mechanics, and the assimilated observables by self-adjoint multiplication operators. An averaging approach is also introduced, rendering the spectrum of the assimilated observables discrete, and thus amenable to numerical approximation. We present a data-driven formulation of the quantum mechanical data assimilation approach, utilizing kernel methods from machine learning and delay-coordinate maps of dynamical systems to represent the evolution and measurement operators via matrices in a data-driven basis. The data-driven formulation is structurally similar to its infinite-dimensional counterpart, and is shown to converge in a limit of large data under mild assumptions. Applications to periodic oscillators and the Lorenz 63 system demonstrate that the framework is able to naturally handle highly non-Gaussian statistics, complex state space geometries, and chaotic dynamics. Comment: 20 pages, 6 figures
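In matrix form, one filtering cycle of the Dirac-von Neumann scheme described above alternates Koopman (Heisenberg-style) evolution of the density operator with a projective measurement update. The following is a highly simplified sketch under assumed conventions (finite-dimensional matrices, a single projective outcome), not the paper's data-driven formulation.

```python
import numpy as np

def qm_assimilation_step(rho, U, E):
    """One illustrative filtering cycle: unitary evolution of the density
    operator rho, then conditioning on an observed outcome represented by
    the projection matrix E, then renormalization to unit trace."""
    rho = U @ rho @ U.conj().T  # dynamical (Koopman-analog) evolution
    rho = E @ rho @ E.conj().T  # measurement update (projection)
    return rho / np.trace(rho)  # renormalize to a valid state
```

For instance, starting from the maximally uncertain state diag(0.5, 0.5) and observing the outcome projected by E = diag(1, 0) collapses the state onto that outcome, the quantum-mechanical counterpart of Bayesian conditioning in classical filtering.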